
    Relationship between covariance of Wigner functions and transformation noncontextuality

    We relate two notions of classicality for quantum transformations often arising in popular subtheories of quantum mechanics: covariance of the Wigner representation of the theory and the existence of a transformation noncontextual ontological model of the theory. We show that covariance implies transformation noncontextuality. The converse holds provided that the underlying ontological model is the one given by the Wigner representation. In addition, we investigate the relationships of covariance and transformation noncontextuality with the existence of a positivity-preserving quasiprobability distribution for the transformations of the theory. We conclude that covariance implies transformation noncontextuality, which implies positivity preservation. Therefore the violation of the latter is a stronger notion of nonclassicality than the violation of the former. Comment: 11 pages, 2 figures
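    For orientation, a hedged sketch of what covariance of the Wigner representation usually amounts to (standard phase-space conventions; the paper's precise definitions may differ in detail): a transformation given by a unitary U is covariantly represented when its action corresponds to an (affine) symplectic map S on phase space,

    \[
      W_{U \rho U^\dagger}(\lambda) \;=\; W_{\rho}\!\left(S^{-1}(\lambda)\right),
    \]

    so the Wigner function of the output is just a reshuffling of the Wigner function of the input over phase-space points, and no negativity can be generated.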

    A Simulation-Based Study on Driver Behavior when Negotiating Curves with Sight Limitations

    The abstract is in the attachment.

    A mathematical framework for operational fine tunings

    In the framework of ontological models, the features of quantum mechanics that emerge as inherently nonclassical always involve properties that are fine tuned, i.e., properties that hold at the operational level but break at the ontological level (they only hold for fine-tuned values of the ontic parameters). Famous examples of such features are contextuality and nonlocality. We here develop a precise theory-independent mathematical framework for characterizing operational fine tunings. These are distinct from the causal fine tunings already introduced by Wood and Spekkens in [NJP 17, 033002 (2015)], as they do not involve any assumption on the underlying causal structure. We show how all the already known examples of operational fine tunings fit into our framework, we discuss possible new fine tunings, and we use the framework to shed new light on the relation between nonlocality and generalized contextuality, where the former, unlike the latter, can also involve a causal fine tuning. The framework is also formulated in the language of category theory and functors. Comment: 26 pages, 6 figures
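    As a hedged illustration of the kind of fine tuning at stake (standard ontological-models notation, not necessarily the paper's exact formalism): two preparation procedures P and P' are operationally equivalent when

    \[
      p(k \mid M, P) \;=\; p(k \mid M, P') \quad \text{for every measurement } M \text{ and outcome } k,
    \]

    while preparation noncontextuality demands the corresponding ontological statement \mu_P(\lambda) = \mu_{P'}(\lambda). A model that reproduces the operational equivalence only through a cancellation among distinct distributions \mu_P \neq \mu_{P'} realizes it as a fine tuning of the ontic parameters.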

    Tsirelson's bound and Landauer's principle in a single-system game

    We introduce a simple single-system game inspired by the Clauser-Horne-Shimony-Holt (CHSH) game. For qubit systems subjected to unitary gates and projective measurements, we prove that any strategy in our game can be mapped to a strategy in the CHSH game, which implies that Tsirelson's bound also holds in our setting. More generally, we show that the optimal success probability depends on the reversible or irreversible character of the gates, the quantum or classical nature of the system, and the system dimension. We analyse the bounds obtained in light of Landauer's principle, showing the entropic costs of the erasure associated with the game. This reveals a connection between the reversibility of fundamental operations, embodied by Landauer's principle, and Tsirelson's bound, which arises from the restricted physics of a unitarily evolving single-qubit system. Comment: 7 pages, 5 figures, typos corrected
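    For reference, the two standard bounds the abstract connects (quoted here for context, not taken from the paper): the optimal CHSH success probability is 3/4 for classical strategies and

    \[
      p_{\mathrm{CHSH}}^{\max} \;=\; \cos^2(\pi/8) \;=\; \frac{2+\sqrt{2}}{4} \;\approx\; 0.854
    \]

    for quantum strategies (Tsirelson's bound), while Landauer's principle assigns a minimal heat cost of

    \[
      Q \;\ge\; k_B T \ln 2
    \]

    to the erasure of one bit of information at temperature T.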

    On next-to-eikonal corrections to threshold resummation for the Drell-Yan and DIS cross sections

    We study corrections suppressed by one power of the soft gluon energy to the resummation of threshold logarithms for the Drell-Yan cross section and for Deep Inelastic structure functions. While no general factorization theorem is known for these next-to-eikonal (NE) corrections, it is conjectured that at least a subset will exponentiate, along with the logarithms arising at leading power. Here we develop some general tools to study NE logarithms, and we construct an ansatz for threshold resummation that includes various sources of NE corrections, implementing in this context the improved collinear evolution recently proposed by Dokshitzer, Marchesini and Salam (DMS). We compare our ansatz to existing exact results at two and three loops, finding evidence for the exponentiation of leading NE logarithms and confirming the predictivity of DMS evolution. Comment: 17 pages
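    Schematically (with conventions that may differ from the paper's), near partonic threshold z -> 1 the Drell-Yan coefficient function contains, at order \alpha_s^n, leading-power (eikonal) terms and next-to-eikonal terms of the form

    \[
      \alpha_s^n \left[\frac{\ln^{m}(1-z)}{1-z}\right]_+ \quad \text{(eikonal)}, \qquad
      \alpha_s^n \,\ln^{m}(1-z) \quad \text{(next-to-eikonal)}, \qquad m \le 2n-1,
    \]

    and it is the exponentiation of the second class of terms that the ansatz described above is built to capture.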

    Non-classicality as a computational resource

    One of the main questions in the field of quantum computation is where the quantum computational speed-up comes from. Recent studies in quantum foundations have suggested which features should be considered inherently non-classical. One of the major contributions in this direction is Spekkens' toy theory, a model built to reproduce quantum theory as a classical, phase-space-inspired theory with restrictions on what an observer can know about reality. The model reproduces many of the features of quantum mechanics, but it does not reproduce non-locality and contextuality. In this thesis we first complete Spekkens' toy theory with measurement update rules and a mathematical framework that generalises it to systems of any finite dimension (prime and non-prime). We also extend the operational equivalence between the toy theory and stabilizer quantum mechanics to all odd dimensions via Gross' Wigner functions. We then use the toy theory to represent the non-contextual and classically simulatable part of the computation in state-injection schemes of quantum computation, where contextuality is a resource. In the case of qubits, we show that the subtheories of quantum mechanics represented in the toy model can achieve the full stabilizer theory via state injection, and we associate different proofs of contextuality with different injection processes. Stepping back from Spekkens' toy theory, we conclude by focusing on single-system protocols that compute non-linear functions (similar to the popular CHSH game) and show quantum advantages even in the absence of non-locality and contextuality (in their standard notions). We analyse their performance (formalised in Bell's and Tsirelson's bounds) in relation to Landauer's principle, which associates entropic costs to irreversible computations, and to a new notion of contextuality for sequences of transformations.
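    Since Gross' Wigner function does much of the work in the extension mentioned above, here is a hedged sketch of its standard definition for a single system of odd dimension d (conventions vary across the literature): with Heisenberg-Weyl displacement operators D_u, u = (q,p) in Z_d x Z_d, and phase-point operators A_u = D_u A_0 D_u^\dagger, where A_0 = \frac{1}{d}\sum_u D_u, one sets

    \[
      W_\rho(u) \;=\; \frac{1}{d}\,\mathrm{Tr}\!\left[A_u\,\rho\right],
    \]

    a quasiprobability distribution that is non-negative on all stabilizer states and covariant under Clifford unitaries, which is what lets it mediate between the toy theory and stabilizer quantum mechanics in odd dimensions.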

    Relating compatibility and divisibility of quantum channels

    We connect two key concepts in quantum information: compatibility and divisibility of quantum channels. Two channels are compatible if they can both be obtained via marginalization from a third channel. A channel divides another channel if it reproduces its action by sequential composition with a third channel. (In)compatibility is of central importance for studying the difference between classical and quantum dynamics. The relevance of divisibility lies in its close relationship with the onset of Markovianity. We emphasize the simulability character of both compatibility and divisibility and, despite their structural difference, we find a set of channels -- self-degradable channels -- for which the two notions coincide. We also show that, for degradable channels, compatibility implies divisibility, and that, for anti-degradable channels, divisibility implies compatibility. These results motivate further research on these classes of channels and shed new light on the meaning of these two widely studied notions. Comment: Suggestions are welcome!
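    A hedged formalization of the two definitions quoted above (the notation is ours, for illustration): channels \Phi_A and \Phi_B with a common input are compatible if there exists a joint channel \Phi_{AB} such that

    \[
      \Phi_A \;=\; \mathrm{Tr}_B \circ \Phi_{AB}, \qquad \Phi_B \;=\; \mathrm{Tr}_A \circ \Phi_{AB},
    \]

    while a channel \Phi_1 divides a channel \Phi_2 if there exists a channel \Theta such that \Phi_2 = \Theta \circ \Phi_1. In both cases one channel is simulated by means of an existentially quantified third channel, which is the shared simulability character the abstract points to.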